From tensor-network quantum states to tensorial recurrent neural networks
Authors
Abstract
A recurrent neural network (RNN) with a linear memory update is proposed that can exactly represent any matrix product state (MPS), and is further generalized to 2D lattices via a multilinear memory update. The network supports perfect sampling and wave-function evaluation in polynomial time, exactly represents states with area-law entanglement entropy, and outperforms MPS in parameter efficiency by orders of magnitude.
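The MPS-as-RNN correspondence in the abstract can be illustrated concretely: the RNN "memory" is a vector that each input symbol updates by a purely linear map, so unrolling the recurrence reproduces the chain of matrix products defining an MPS amplitude. The sketch below is our own NumPy construction; all names, dimensions, and the random initialization are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hypothetical sketch: evaluating an MPS amplitude as an RNN whose memory
# is updated *linearly* by an input-dependent matrix. Dimensions and names
# (n_sites, phys_dim, bond_dim, tensors) are illustrative assumptions.
rng = np.random.default_rng(0)
n_sites, phys_dim, bond_dim = 6, 2, 4

# One rank-3 MPS tensor per site: (physical index, bond out, bond in).
tensors = [rng.normal(size=(phys_dim, bond_dim, bond_dim))
           for _ in range(n_sites)]
boundary_left = rng.normal(size=bond_dim)
boundary_right = rng.normal(size=bond_dim)

def amplitude(config):
    """psi(s1, ..., sn): the hidden memory h is multiplied by the matrix
    selected by each physical index -- an RNN with a linear memory update
    and no nonlinearity, which is exactly a matrix product state."""
    h = boundary_left
    for site, s in enumerate(config):
        h = tensors[site][s] @ h  # linear update, weights chosen by input s
    return h @ boundary_right

print(amplitude([0, 1, 0, 1, 1, 0]))
```

Because the update is linear, the unrolled recurrence collapses to the single matrix chain `boundary_right @ A[s_n] @ ... @ A[s_1] @ boundary_left`, which is the standard MPS contraction.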
Similar articles
Unifying neural-network quantum states and correlator product states via tensor networks
Correlator product states (CPS) are a powerful and very broad class of states for quantum lattice systems whose (unnormalised) amplitudes in a fixed basis can be sampled exactly and efficiently. They work by gluing together states of overlapping clusters of sites on the lattice, called correlators. Recently Carleo and Troyer (2017 Science 355 602) introduced a new type of sampleable ansatz called ...
Restricted Recurrent Neural Tensor Networks
Increasing the capacity of recurrent neural networks (RNN) usually involves augmenting the size of the hidden layer, resulting in a significant increase of computational cost. An alternative is the recurrent neural tensor network (RNTN), which increases capacity by employing distinct hidden-layer weights for each vocabulary word. However, memory usage scales linearly with vocabulary size, which...
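The word-indexed weight mechanism described in this snippet can be sketched in a few lines (a minimal NumPy construction of our own; every name, size, and the embedding term are illustrative assumptions, not the cited paper's code):

```python
import numpy as np

# Illustrative sketch: a standard RNN shares one recurrent matrix across
# all inputs, while an RNTN-style update indexes a distinct hidden-to-hidden
# matrix by the current word. Names and sizes are assumptions.
rng = np.random.default_rng(1)
vocab_size, hidden = 5, 8

W_shared = 0.1 * rng.normal(size=(hidden, hidden))               # RNN
W_per_word = 0.1 * rng.normal(size=(vocab_size, hidden, hidden))  # RNTN-style
emb = rng.normal(size=(vocab_size, hidden))  # per-word input term

def rnn_step(h, word):
    # One shared weight matrix regardless of the input word.
    return np.tanh(W_shared @ h + emb[word])

def rntn_step(h, word):
    # Word-indexed recurrent weights: capacity grows without enlarging h,
    # but parameter count scales linearly with vocab_size.
    return np.tanh(W_per_word[word] @ h + emb[word])

h = np.zeros(hidden)
for word in [0, 3, 2]:
    h = rntn_step(h, word)
```

The linear scaling the snippet warns about is visible directly in the shapes: the shared matrix holds `hidden**2` parameters, while the per-word tensor holds `vocab_size * hidden**2`.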
Tensorial Recurrent Neural Networks for Longitudinal Data Analysis
Traditional Recurrent Neural Networks assume vectorized data as inputs. However, many data from modern science and technology come in certain structures, such as tensorial time-series data. To apply recurrent neural networks to this type of data, a vectorisation process is necessary, while such a vectorisation leads to the loss of the precise information of the spatial or longitudinal dimens...
Tensor Decomposition for Compressing Recurrent Neural Network
In the machine learning fields, Recurrent Neural Network (RNN) has become a popular algorithm for sequential data modeling. However, behind the impressive performance, RNNs require a large number of parameters for both training and inference. In this paper, we are trying to reduce the number of parameters and maintain the expressive power from RNN simultaneously. We utilize several tensor decom...
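The parameter-reduction idea in this snippet can be illustrated with the simplest tensor-style factorization, a low-rank split of the recurrent weight matrix (our own minimal construction; the cited paper uses more elaborate decompositions, and all names and sizes here are assumptions):

```python
import numpy as np

# Minimal illustration: replace a dense hidden-to-hidden matrix with a
# low-rank factorization to cut the recurrent parameter count while
# keeping the same input/output shapes. Sizes are illustrative.
rng = np.random.default_rng(2)
hidden, rank = 64, 4

W_full = rng.normal(size=(hidden, hidden))  # hidden**2 = 4096 parameters
U = rng.normal(size=(hidden, rank))         # factorized form:
V = rng.normal(size=(rank, hidden))         # 2 * hidden * rank = 512 parameters

def step_full(h):
    return np.tanh(W_full @ h)

def step_factored(h):
    # Same map structure, far fewer parameters: W is approximated as U @ V.
    return np.tanh(U @ (V @ h))

print(W_full.size, U.size + V.size)  # prints: 4096 512
```

The trade-off the snippet alludes to is expressiveness: the factorized map has rank at most `rank`, so compression must be balanced against the RNN's capacity.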
Journal
Journal title: Physical Review Research
Year: 2023
ISSN: 2643-1564
DOI: https://doi.org/10.1103/physrevresearch.5.l032001